Knowledge Transfer between Neural Networks

Authors

  • Kenneth McGarry
  • John MacIntyre
Abstract

The goal of knowledge transfer is to take advantage of previous training experience to solve related but new tasks. This paper tackles the issue of transferring knowledge between radial basis function neural networks. We present some preliminary work illustrating how a neural network trained on one task (the source) can be used to assist in synthesizing a network for a new but similar task (the target).
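The abstract does not spell out how the source network assists in building the target network, so the following is only a hypothetical sketch of one common reuse strategy for radial basis function networks: copy the trained hidden-unit centers and widths from the source network and refit only the linear output weights on the target task. All names and values below (rbf_design_matrix, fit_output_weights, the 20-unit source layer) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf_design_matrix(X, centers, widths):
    """Gaussian RBF activations for inputs X, given hidden-unit centers and widths."""
    # Squared Euclidean distance between every input row and every center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))

def fit_output_weights(X, y, centers, widths, reg=1e-6):
    """Refit only the linear output layer (ridge regression on fixed RBF features)."""
    H = rbf_design_matrix(X, centers, widths)
    A = H.T @ H + reg * np.eye(H.shape[1])
    return np.linalg.solve(A, H.T @ y)

# Hidden layer of a source network assumed to be already trained on the source task.
rng = np.random.default_rng(0)
source_centers = rng.standard_normal((20, 4))   # 20 hidden units, 4-dimensional inputs
source_widths = np.ones(20)

# Target task: keep the source hidden layer, learn only new output weights.
X_target = rng.standard_normal((100, 4))
y_target = rng.standard_normal(100)
w_target = fit_output_weights(X_target, y_target, source_centers, source_widths)
```

Because only a linear least-squares fit is repeated for the target task, this kind of reuse is cheap when the two tasks share similar input distributions; if they do not, the copied centers can instead serve merely as an initialization for further training.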


Related works

Gyroscope Random Drift Modeling, using Neural Networks, Fuzzy Neural and Traditional Time-series Methods

In this paper, statistical and time-series models are used to determine the random drift of a Dynamically Tuned Gyroscope (DTG). This drift is compensated with an optimal predictive transfer function. Nonlinear neural-network and fuzzy-neural models are also investigated for prediction and compensation of the random drift. Finally, the different models are compared and their advantages a...


Estimating the effect of organizational structure on knowledge transfer: A neural network approach

Artificial neural networks have recently found abundant applications in social science research. In this study, we investigate the topological structures of organizational networks, which may account for differences in the performance of intra-organizational knowledge transfer. We construct two types of networks, hierarchical and scale-free, and single-layer perceptron mo...


Progressive Neural Networks for Transfer Learning in Emotion Recognition

Many paralinguistic tasks are closely related and thus representations learned in one domain can be leveraged for another. In this paper, we investigate how knowledge can be transferred between three paralinguistic tasks: speaker, emotion, and gender recognition. Further, we extend this problem to cross-dataset tasks, asking how knowledge captured in one emotion dataset can be transferred to an...


Born Again Neural Networks

Knowledge distillation techniques seek to transfer knowledge acquired by a learned teacher model to a new student model. In prior work, the teacher typically is a high-capacity model with formidable performance, while the student is more compact. By transferring knowledge, one hopes to benefit from the student’s compactness while suffering only minimal degradation in performance. In this paper,...
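As background for the teacher-student mechanism this abstract refers to, knowledge distillation is usually implemented by training the student to match the teacher's softened output distribution in addition to the ground-truth labels. The PyTorch sketch below shows that generic loss; it is an illustration of standard distillation, not the specific Born-Again Networks training procedure.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Generic KD loss: softened teacher targets plus hard-label cross-entropy."""
    # KL divergence between temperature-softened teacher and student distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Random tensors standing in for one mini-batch of logits and labels.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```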


Like What You Like: Knowledge Distill via Neuron Selectivity Transfer

Although deep neural networks have demonstrated extraordinary power in various applications, their superior performance comes at the expense of high storage and computational cost. Consequently, the acceleration and compression of neural networks have attracted much attention recently. Knowledge Transfer (KT), which aims at training a smaller student network by transferring knowledge from a larger t...



Journal:

Volume   Issue 

Pages  -

Publication year: 2002